
[air] large tune/torch benchmark #26763

Merged (3 commits into ray-project:master on Jul 23, 2022)

Conversation

richardliaw (Contributor)

Why are these changes needed?

Tests distributed hyperparameter tuning against plain distributed training, to make sure there is no performance regression when multiple distributed training trials run concurrently. A sketch of the idea follows.
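To make the comparison concrete, here is a minimal, hypothetical sketch of such a benchmark, assuming Ray AIR's 2.x APIs (`TorchTrainer` and `Tuner`). The toy model, worker counts, hyperparameters, and timing logic are illustrative assumptions, not the release-test script added in this PR.

```python
# Hypothetical benchmark sketch (not this PR's script): time one distributed
# training run, then time Tune running several such runs concurrently.
import time

import torch
from ray import tune
from ray.air import session
from ray.air.config import ScalingConfig
from ray.train.torch import TorchTrainer
from ray.tune import Tuner


def train_loop_per_worker(config):
    # Toy workload standing in for the real torch training benchmark.
    model = torch.nn.Linear(8, 1)
    optimizer = torch.optim.SGD(model.parameters(), lr=config["lr"])
    for _ in range(config["epochs"]):
        x = torch.randn(128, 8)
        loss = model(x).pow(2).mean()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        session.report({"loss": loss.item()})


trainer = TorchTrainer(
    train_loop_per_worker,
    train_loop_config={"lr": 1e-3, "epochs": 5},
    scaling_config=ScalingConfig(num_workers=4),
)

# Baseline: a single distributed training run.
start = time.monotonic()
trainer.fit()
single_run_s = time.monotonic() - start

# Tuning: multiple distributed runs executing concurrently under Tune.
start = time.monotonic()
Tuner(
    trainer,
    param_space={"train_loop_config": {"lr": tune.grid_search([1e-3, 1e-2])}},
).fit()
concurrent_s = time.monotonic() - start

print(f"single run: {single_run_s:.1f}s, concurrent tuning: {concurrent_s:.1f}s")
```

Comparing the two wall-clock times (normalized per trial) is one way to surface scheduling or resource-contention regressions when distributed trials run side by side.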

Related issue number

Checks

  • I've run scripts/format.sh to lint the changes in this PR.
  • I've included any doc changes needed for https://docs.ray.io/en/master/.
  • I've made sure the tests are passing. Note that there might be a few flaky tests, see the recent failures at https://flakey-tests.ray.io/
  • Testing Strategy
    • Unit tests
    • Release tests
    • This PR is not tested :(

Signed-off-by: Richard Liaw <[email protected]>
richardliaw added this to the Ray AIR milestone on Jul 20, 2022
krfricke (Contributor) left a comment:


Looks good, let's decide on the frequency though

Review comment on release/release_tests.yaml (outdated; resolved)
richardliaw merged commit 96e8027 into ray-project:master on Jul 23, 2022
richardliaw deleted the bigger-test-torch-tune branch on Jul 23, 2022 at 08:17
Rohan138 pushed a commit to Rohan138/ray that referenced this pull request Jul 28, 2022
Stefan-1313 pushed a commit to Stefan-1313/ray_mod that referenced this pull request Aug 18, 2022
Co-authored-by: Kai Fricke <[email protected]>
Signed-off-by: Stefan van der Kleij <[email protected]>
Labels: none yet
Projects: none yet
Linked issues: none
Participants: 4